Algorithmic Information Theory and Foundations of Probability

Author

  • Alexander Shen

Abstract

The question of how and why mathematical probability theory can be applied to the "real world" has been debated for centuries. We try to survey the role of algorithmic information theory (Kolmogorov complexity) in this debate.

1 Probability theory paradox

One often describes the framework of the natural sciences as follows: a hypothesis is used to predict something, and the prediction is then checked against the observed behavior of the system; if there is a contradiction, the hypothesis needs to be changed. Can we include probability theory in this framework? A statistical hypothesis (say, the assumption of a fair coin) should then be checked against the experimental data (the results of coin tossing) and rejected if some discrepancy is found. However, there is an obvious problem: the fair-coin assumption says that in a series of, say, 1000 coin tosses, all 2^1000 possible outcomes (all 2^1000 bit strings of length 1000) have the same probability 2^(−1000). How can we say that some of them contradict the assumption while others do not?

The same paradox can be explained in a different way. Consider a casino that wants to outsource the task of card shuffling to a special factory that produces shrink-wrapped, well-shuffled decks of cards. This factory would need a quality control department: it looks at each deck before shipping it to the customer, blocks some "badly shuffled" decks, and approves others as "well shuffled". But how is this possible if all n! orderings of n cards have the same probability?

2 Current best practice

Whatever the philosophers say, statisticians have to perform their duties. Let us try to describe their current "best practice" (see [7, 8]).

A. How a statistical hypothesis is applied. First of all, we have to admit that probability theory makes no predictions but only gives recommendations: if the probability (computed on the basis of …

* LIF Marseille, CNRS & Univ. Aix–Marseille. On leave from IITP, RAS, Moscow. Supported in part by the NAFIT ANR-08-EMER-008-01 grant. E-mail: [email protected]
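The fair-coin paradox above lends itself to a small computational illustration. The sketch below is ours, not from the paper: it shows that probability assigns the same value to every length-1000 outcome, while compressed length, a rough practical stand-in for Kolmogorov complexity, cleanly separates a regular outcome from a typical one.

```python
# Every length-N bit string has the same probability 2^(-N) under the
# fair-coin hypothesis, so probability alone cannot flag any outcome as
# suspicious. Compressed size -- a crude, practical proxy for Kolmogorov
# complexity -- can: regular strings have short descriptions.
import random
import zlib

N = 1000
prob = 2.0 ** -N  # identical for every outcome: about 9.3e-302

regular = "0" * N                                         # "all tails"
typical = "".join(random.choice("01") for _ in range(N))  # actual tosses

for name, s in [("regular", regular), ("typical", typical)]:
    size = len(zlib.compress(s.encode()))
    print(f"{name}: probability {prob:.2e}, compressed to {size} bytes")
```

Both outcomes are equally probable, yet the all-zeros string compresses to a couple of dozen bytes, while the typical string cannot shrink much below its 1000 bits of genuine content. A description much shorter than the string itself is exactly the kind of regularity that makes an outcome, or a deck of cards, look badly shuffled.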


Similar Articles

Probabilistic Sufficiency and Algorithmic Sufficiency from the point of view of Information Theory

Given the importance of Markov chains in information theory, conditional probability for these random processes can also be defined in terms of mutual information. In this paper, the relationship between the concept of sufficiency and Markov chains from the perspective of information theory, and the relationship between probabilistic sufficiency and algorithmic sufficien...
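The mutual-information characterization alluded to above can be stated concretely: X → Y → Z is a Markov chain exactly when the conditional mutual information I(X; Z | Y) is zero. A minimal sketch, with distributions invented purely for illustration:

```python
# I(X; Z | Y) = sum_{x,y,z} p(x,y,z) * log2( p(y) p(x,y,z) / (p(x,y) p(y,z)) )
# vanishes iff X -> Y -> Z is a Markov chain. The joint below factors as
# p(x) p(y|x) p(z|y), so it is Markov by construction and the result is ~0.
from itertools import product
from math import log2

px   = {0: 0.3, 1: 0.7}                            # P(X)
py_x = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}  # P(Y | X)
pz_y = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.5, 1: 0.5}}  # P(Z | Y)

joint = {(x, y, z): px[x] * py_x[x][y] * pz_y[y][z]
         for x, y, z in product((0, 1), repeat=3)}

def cond_mutual_info(p):
    """Conditional mutual information I(X; Z | Y) in bits for a joint dict."""
    py  = {y: sum(p[x, y, z] for x in (0, 1) for z in (0, 1)) for y in (0, 1)}
    pxy = {(x, y): sum(p[x, y, z] for z in (0, 1))
           for x in (0, 1) for y in (0, 1)}
    pyz = {(y, z): sum(p[x, y, z] for x in (0, 1))
           for y in (0, 1) for z in (0, 1)}
    return sum(q * log2(py[y] * q / (pxy[x, y] * pyz[y, z]))
               for (x, y, z), q in p.items() if q > 0)

print(cond_mutual_info(joint))  # ~0.0 up to floating-point rounding
```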


Algorithmic information theory and martingales

1. A.N. Kolmogorov, Several theorems about algorithmic entropy and algorithmic amount of information. The algorithmic approach to the foundations of information theory and probability theory did not develop far in the first years after its appearance, since some questions raised at the very start remained unanswered. Now the situation has changed somewhat. In particular, it has been ascertained that the de...


Uncertainty and Information: Foundations of Generalized Information Theory (a book review)

We present a review of the book by George J. Klir, Uncertainty and Information: Foundations of Generalized Information Theory, Wiley, Hoboken, New Jersey 2006, ISBN: 0-471-74867-6, 520 pp.


Shannon entropy: a rigorous mathematical notion at the crossroads between probability, information theory, dynamical systems and statistical physics

Statistical entropy was introduced by Shannon as a basic concept in information theory, measuring the average missing information on a random source. Extended into an entropy rate, it gives bounds in coding and compression theorems. I here present how statistical entropy and entropy rate relate to other notions of entropy, relevant either to probability theory (entropy of a discrete probability...
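As a hedged aside (ours, not from the article), the central quantity here is simple to compute: the Shannon entropy H(p) = −Σ_i p_i log2 p_i measures, in bits, the average missing information about a source emitting symbols with distribution p.

```python
# Shannon entropy of a discrete distribution, in bits per symbol.
from math import log2

def entropy(probs):
    """H(p) = -sum p_i log2 p_i, with the convention 0 log 0 = 0."""
    return sum(-p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: fair coin
print(entropy([1.0]))        # 0.0 bits: no uncertainty
print(entropy([0.25] * 4))   # 2.0 bits: four equally likely symbols
```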


Lecture notes on descriptional complexity and randomness

A didactical survey of the foundations of Algorithmic Information Theory. These notes are short on motivation, history and background but introduce some of the main techniques and concepts of the field. The "manuscript" has been evolving over the years. Please look at "Version history" below to see what has changed when.


Kolmogorov's Contributions to the Foundations of Probability

Andrei Nikolaevich Kolmogorov was the foremost contributor to the mathematical and philosophical foundations of probability in the twentieth century, and his thinking on the topic is still potent today. In this article we first review the three stages of Kolmogorov’s work on the foundations of probability: (1) his formulation of measure-theoretic probability, 1933, (2) his frequentist theory of...




Publication date: 2009